Nanodegree key: nd209
Version: 1.0.0
Locale: en-us
Fuse computer vision, machine learning, mechanics, and hardware systems to build bots of the future!
Content
Part 01 : Term 1: ROS Essentials, Perception, and Control
In this module, you'll get an introduction to your Nanodegree program and a comprehensive overview of the field of robotics. You'll also build your first project, modeled after the NASA Mars Rover Challenge.
-
Module 01: Introduction to Robotics
-
Lesson 01: Welcome
In this first lesson, you'll meet your instructors, learn about the structure of this program and about the services available to you as a student.
-
Lesson 02: What is a Robot?
Ask three people what a robot is and you'll get three different answers! Here we ask your instructors, three expert roboticists from Electric Movement.
-
Lesson 03: Search and Sample Return
In this lesson, you'll learn the skills you need to tackle the first project, where you'll experience the three essential elements of robotics -- perception, decision making, and actuation.
- Concept 01: Overview of the Project
- Concept 02: Explore the Simulator
- Concept 03: Telemetry and Record Data
- Concept 04: Jupyter Notebook
- Concept 05: Workspace Playground
- Concept 06: The Perception Step
- Concept 07: Perspective Transform
- Concept 08: Warp, Threshold, & Map
- Concept 09: Map to World Coordinates
- Concept 10: Decision: Where to Go?
- Concept 11: Test Your Methods
- Concept 12: Online: Rover Lab
- Concept 13: Environment Setup (Local)
- Concept 14: More Decisions
- Concept 15: Take Action!
- Concept 16: Autonomous Mode
- Concept 17: Requirements & Challenges
- Concept 18: Common Questions
-
Lesson 04: Career Support Overview
As you learn the skills you’ll need in order to work in the robotics industry, you’ll see Career Lessons and Projects that will help you improve your online professional profiles.
-
Lesson 05: Get Help from Peers and Mentors
You are starting a challenging journey. Take 3 minutes to read how to get help with projects and content.
-
Lesson 06: Get Help with Your Account
What to do if you have questions about your account or general questions about the program.
-
-
Module 02: ROS Essentials
-
Module 03: GitHub Lesson
-
Lesson 01: GitHub
Build a GitHub profile on par with those of senior software engineers.
- Concept 01: Introduction
- Concept 02: GitHub profile important items
- Concept 03: Good GitHub repository
- Concept 04: Interview with Art - Part 1
- Concept 05: Identify fixes for example “bad” profile
- Concept 06: Quick Fixes #1
- Concept 07: Quick Fixes #2
- Concept 08: Writing READMEs with Walter
- Concept 09: Interview with Art - Part 2
- Concept 10: Commit messages best practices
- Concept 11: Reflect on your commit messages
- Concept 12: Participating in open source projects
- Concept 13: Interview with Art - Part 3
- Concept 14: Participating in open source projects 2
- Concept 15: Starring interesting repositories
- Concept 16: Next Steps
-
-
Module 04: Udacity Explores - Biologically Inspired Robots
-
Lesson 01: Udacity Explores - Biologically Inspired Robots
This week you'll learn about biologically inspired robots, and why studying these robots may unlock the potential of future robotic systems.
- Concept 01: Welcome
- Concept 02: Meet Arpan Chakraborty
- Concept 03: Actuation
- Concept 04: Vision
- Concept 05: Control
- Concept 06: READ: Research Papers
- Concept 07: Biologically Inspired Approaches
- Concept 08: Robotic Insect
- Concept 09: Cheetah Locomotion
- Concept 10: OpenRatSLAM
- Concept 11: WATCH: Concepts in Action
- Concept 12: DO: Lab
- Concept 13: Lab: Configuration
- Concept 14: Lab: Challenge
-
Lesson 02: 6 Questions on Robotics Careers
Robotics is an interdisciplinary field, which means there are different job opportunities depending on your background and interests. Discover the options for a career in robotics.
-
-
Module 05: Kinematics
-
Lesson 01: Intro to Kinematics
In this lesson you'll get an introduction to the exciting field of kinematics and learn about the most important types of serial manipulators used in the robotics industry.
- Concept 01: Overview
- Concept 02: Intro to Kinematics
- Concept 03: Degrees of Freedom
- Concept 04: Two DoF Arm
- Concept 05: Generalized Coordinates
- Concept 06: Rigid Bodies in Free Space
- Concept 07: Joint Types
- Concept 08: Principal Types of Serial Manipulators
- Concept 09: Serial Manipulator Applications
- Concept 10: Quiz: Reachable Workspace
-
Lesson 02: Forward and Inverse Kinematics
Here you'll dive deep into the details of solving the forward and inverse kinematics problems for serial manipulators; a short SymPy sketch follows the concept list.
- Concept 01: Setting up the Problem
- Concept 02: Coordinate Frames and Vectors
- Concept 03: Rotation Matrices in 2D
- Concept 04: Rotation Matrices in 3D
- Concept 05: Quiz: Build a Rotation Matrix
- Concept 06: Rotations in Sympy
- Concept 07: Composition of Rotations
- Concept 08: Euler Angles from a Rotation Matrix
- Concept 09: Translations
- Concept 10: Homogeneous Transforms and their Inverse
- Concept 11: Composition of Homogeneous Transforms
- Concept 12: Denavit-Hartenberg Parameters
- Concept 13: DH Parameter Assignment Algorithm
- Concept 14: DH Steps 1-4
- Concept 15: DH Steps 5-8
- Concept 16: DH Parameter Table
- Concept 17: Forward Kinematics
- Concept 18: Inverse Kinematics
- Concept 19: Inverse Kinematics Example
- Concept 20: Wrap Up
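The lesson introduces SymPy for building rotations symbolically. Below is a minimal, illustrative SymPy sketch (my own variable names and an assumed 1.5-unit translation, not the lesson's code) that composes a rotation about the z-axis with a translation into a single 4x4 homogeneous transform and applies it to a point.

```python
# Illustrative sketch: compose a z-axis rotation and a translation into one
# homogeneous transform, then transform a point. Not the lesson's solution code.
from sympy import symbols, cos, sin, pi, Matrix

q = symbols('q')  # joint angle

# Elementary rotation about the z-axis
R_z = Matrix([[cos(q), -sin(q), 0],
              [sin(q),  cos(q), 0],
              [0,       0,      1]])

# Homogeneous transform: rotation R_z plus a translation of 1.5 along x (assumed value)
T = R_z.row_join(Matrix([1.5, 0, 0])).col_join(Matrix([[0, 0, 0, 1]]))

# Transform the point (1, 0, 0) after rotating the frame by 90 degrees
p = Matrix([1, 0, 0, 1])
print((T.subs(q, pi/2) * p).evalf())   # -> (1.5, 1, 0, 1)
```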
-
Lesson 03: Project: Robotic Arm: Pick & Place
In this lesson you will learn how to control a robotic arm with six degrees of freedom to perform pick and place actions using Inverse Kinematics.
- Concept 01: Overview
- Concept 02: Setting up your environment
- Concept 03: Introduction to Project Tools
- Concept 04: Gazebo Basics
- Concept 05: Exploring Gazebo User Interface
- Concept 06: Understanding Unified Robot Description Format (URDF)
- Concept 07: RViz Basics
- Concept 08: MoveIt! Basics
- Concept 09: Pick and Place Demo Walkthrough
- Concept 10: KR210 Forward Kinematics 1
- Concept 11: KR210 Forward Kinematics 2
- Concept 12: KR210 Forward Kinematics 3
- Concept 13: Forward Kinematics with Kuka KR210
- Concept 14: Debugging Forward Kinematics
- Concept 15: Inverse Kinematics with Kuka KR210
- Concept 16: Creating IK Solver for Kuka KR210
- Concept 17: Debugging Inverse Kinematics and Optimization
- Concept 18: Requirements and Challenge
- Concept 19: Common Questions
- Concept 20: Project Guide
-
-
Module 06: Udacity Explores - HRI / Robot Ethics
-
Lesson 01: Udacity Explores - Human Robot Interaction & Robot Ethics
Learn about some of the technical and ethical challenges when robots co-exist with humans.
- Concept 01: Introduction
- Concept 02: READ: Research Papers
- Concept 03: Uncanny Valley
- Concept 04: Designing Robot Heads
- Concept 05: Human Trajectory Prediction in Crowded Spaces
- Concept 06: Robot Ethics
- Concept 07: Robot Ethics from Philosophy of Science
- Concept 08: WATCH: Concepts in Action
- Concept 09: Meet Andra Keay
- Concept 10: Exclusive: Silicon Valley Robotics
- Concept 11: Meet Cory Kidd
- Concept 12: Exclusive: Catalia Health
- Concept 13: Meet Kaijen Hsiao
- Concept 14: Exclusive: Mayfield Robotics
- Concept 15: DO: Lab - Build a Robot Prototype Under $25
- Concept 16: Feedback Survey
-
Lesson 02: Product Pitch
After you've built your prototype, it's time to talk about it. Learn how to best describe a robot prototype and communicate your goals and knowledge.
-
-
Module 07: Perception
-
Lesson 01: Perception Overview
Here's a quick look at what to expect in the upcoming lessons and project.
-
Lesson 02: Introduction to 3D Perception
Dive into the world of perception in three dimensions! After a brief tour of 3D sensors used in robotics, we'll explore the capabilities of RGB-D cameras, which you'll use in these lessons.
- Concept 01: 3D Perception
- Concept 02: Active Sensors
- Concept 03: Passive Sensors
- Concept 04: RGB-D Cameras
- Concept 05: Sensor Quiz
- Concept 06: Why RGB-D?
- Concept 07: RGB Image Formation
- Concept 08: Adding Depth
- Concept 09: What is a Point Cloud?
- Concept 10: Point Cloud Types
- Concept 11: Creating a Point Cloud from RGB-D Data
- Concept 12: 3D Perception Summary
-
Lesson 03: Calibration, Filtering, and Segmentation
To understand your sensor data, you first need to calibrate! Here you'll get a handle on RGB-D camera calibration and how to do filtering and basic segmentation on your point cloud data. A brief RANSAC sketch follows the concept list.
- Concept 01: Intro to Calibration, Filtering, and Segmentation
- Concept 02: Sensor Calibration
- Concept 03: RGB Camera Model
- Concept 04: Calibration Pattern
- Concept 05: OpenCV Calibration
- Concept 06: Extrinsic Calibration
- Concept 07: RGBD Calibration in ROS
- Concept 08: Point Cloud Filtering
- Concept 09: Tabletop Segmentation Exercise
- Concept 10: Voxel Grid Downsampling
- Concept 11: Pass Through Filtering
- Concept 12: Segmentation in Perception
- Concept 13: RANSAC Overview
- Concept 14: RANSAC Plane Fitting
- Concept 15: Extracting Indices
- Concept 16: Outlier Removal Filter
- Concept 17: Summary
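The exercises in this lesson use the Point Cloud Library; the plain-NumPy sketch below (toy data, assumed distance threshold) only illustrates the RANSAC idea behind plane fitting: repeatedly fit a plane to three random points and keep the plane with the most inliers.

```python
# Illustrative RANSAC plane fitting on a toy point cloud (not the PCL-based exercise code).
import numpy as np

rng = np.random.default_rng(0)

def ransac_plane(points, n_iters=100, dist_thresh=0.01):
    best_inliers = np.array([], dtype=int)
    for _ in range(n_iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                          # degenerate (collinear) sample
            continue
        normal /= norm
        dists = np.abs((points - p1) @ normal)   # point-to-plane distance
        inliers = np.where(dists < dist_thresh)[0]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# Toy cloud: a flat "table" at z = 0 plus scattered "object" points above it
table = np.c_[rng.random((500, 2)), np.zeros(500)]
objects = rng.random((100, 3)) * [1, 1, 0.3] + [0, 0, 0.05]
cloud = np.vstack([table, objects])
inliers = ransac_plane(cloud)
print(f"{len(inliers)} of {len(cloud)} points lie on the fitted plane")
```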
-
Lesson 04: Clustering for Segmentation
Clustering is a powerful machine learning method for segmenting objects of arbitrary shape in your point cloud data. Here you'll compare K-means and Euclidean clustering for object segmentation; a small comparison sketch follows the concept list.
- Concept 01: Object Segmentation
- Concept 02: Downside of Model Fitting
- Concept 03: Clustering
- Concept 04: K-means Clustering
- Concept 05: K-means Playground
- Concept 06: K-means quizzes
- Concept 07: DBSCAN Algorithm
- Concept 08: Comparing DBSCAN and k-means Clustering
- Concept 09: Clustering with PCL
- Concept 10: Publish Your Point Cloud
- Concept 11: Filtering and RANSAC
- Concept 12: Clustering Objects
- Concept 13: Cluster Visualization
- Concept 14: Segmentation Summary
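As a rough point of comparison, the scikit-learn sketch below (toy 2D blobs, assumed eps and cluster count; not the course's PCL-based exercise code) contrasts K-means, which needs the number of clusters up front, with DBSCAN, which discovers clusters of arbitrary shape from a distance threshold, the same idea behind Euclidean clustering in PCL.

```python
# Illustrative comparison of K-means and DBSCAN on toy data.
import numpy as np
from sklearn.cluster import KMeans, DBSCAN

# Two toy "objects" on a tabletop, represented as 2D point blobs
rng = np.random.default_rng(0)
obj_a = rng.normal(loc=[0.0, 0.0], scale=0.05, size=(150, 2))
obj_b = rng.normal(loc=[0.5, 0.2], scale=0.05, size=(150, 2))
points = np.vstack([obj_a, obj_b])

kmeans_labels = KMeans(n_clusters=2, n_init=10).fit_predict(points)
dbscan_labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(points)

print("K-means clusters:", np.unique(kmeans_labels))
print("DBSCAN clusters: ", np.unique(dbscan_labels))  # -1 marks noise points
```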
-
Lesson 05: Object Recognition
In this lesson, you'll take your segmented point cloud and isolate features you can use to train a machine learning algorithm to recognize the object you're looking for! A short feature-and-classifier sketch follows the concept list.
- Concept 01: Intro to Object Recognition
- Concept 02: Features
- Concept 03: Feature Intuition
- Concept 04: Color Spaces
- Concept 05: HSV Intuitions
- Concept 06: Color Histograms
- Concept 07: Histogram in Action
- Concept 08: Surface Normals
- Concept 09: Normals Intuition
- Concept 10: Support Vector Machine
- Concept 11: SVM Intuitions
- Concept 12: SVM Image Classification
- Concept 13: Recognition Exercise
- Concept 14: Generate Features
- Concept 15: Train Your SVM
- Concept 16: Improve Your Model
- Concept 17: Object Recognition
- Concept 18: Recognition Wrap-up
- Concept 19: Common Questions
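A hedged sketch of the feature-and-classifier idea covered here: build color histograms as feature vectors and train a support vector machine on them with scikit-learn. The patch sizes, fake data, and parameters below are illustrative assumptions, not the project's actual feature extraction code.

```python
# Illustrative color-histogram features + SVM classifier on fake image patches.
import numpy as np
from sklearn.svm import SVC

def color_histogram(image, bins=32):
    """Concatenate per-channel histograms into a single feature vector."""
    channels = [np.histogram(image[:, :, c], bins=bins, range=(0, 256))[0]
                for c in range(3)]
    features = np.concatenate(channels).astype(float)
    return features / (features.sum() + 1e-9)   # normalize

# Fake training data: "red-ish" vs "blue-ish" 16x16 RGB patches
rng = np.random.default_rng(1)
red  = [rng.integers(0, 256, (16, 16, 3)) * [1.0, 0.3, 0.3] for _ in range(50)]
blue = [rng.integers(0, 256, (16, 16, 3)) * [0.3, 0.3, 1.0] for _ in range(50)]
X = np.array([color_histogram(img) for img in red + blue])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel='linear').fit(X, y)
print("Training accuracy:", clf.score(X, y))
```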
-
Lesson 06: 3D Perception Project
In the project at the end of this lesson, you'll bring together everything you know about perception in three dimensions, from filtering and segmentation to feature extraction and object recognition!
- Concept 01: Project Intro
- Concept 02: Amazon Robotics Challenge
- Concept 03: Environment Setup
- Concept 04: Project Demo
- Concept 05: Perception Pipeline
- Concept 06: Output yaml files
- Concept 07: Requirements and Challenge
- Concept 08: PR2 Collision Avoidance
- Concept 09: Robot Motion
- Concept 10: Common Questions
- Concept 11: Project Summary
-
-
Module 08: Udacity Explores - Soft Robotics
-
Lesson 01: Udacity Explores - Soft Robotics
Learn about the cutting-edge field of soft robotics.
- Concept 01: Introduction
- Concept 02: This Week's Schedule
- Concept 03: READ: Research Papers
- Concept 04: Soft Robotics: A Perspective
- Concept 05: Untethered Soft Robot
- Concept 06: Design and Fabrication
- Concept 07: Combining Hard and Soft Robots
- Concept 08: Self Folding Robots
- Concept 09: Oribotics
- Concept 10: WATCH: Concepts in Action
- Concept 11: Meet Josh Lessing
- Concept 12: Exclusive: Getting Started in Robotics
- Concept 13: Exclusive: What is Soft Robotics?
- Concept 14: Exclusive: What Does Soft Robots Inc. Do?
- Concept 15: Exclusive: Our Process
- Concept 16: Exclusive: Soft Robots and Food
- Concept 17: Exclusive: The Future of Soft Robots
- Concept 18: Exclusive: Getting Started in Soft Robotics
- Concept 19: Meet Rajat Mishra
- Concept 20: Exclusive: Rajat Mishra
- Concept 21: DO: Lab - Make an Earthworm
- Concept 22: DO: Lab - Metamaterials
-
-
Module 09: Udacity Explores - Robot Grasping
-
Lesson 01: Udacity Explores - Robot Grasping
Learn about cutting-edge research and development in robot grasping.
- Concept 01: Introduction
- Concept 02: READ: Research Papers
- Concept 03: An Overview of Grasping
- Concept 04: Jamming-based Gripper
- Concept 05: Dex-Net 2.0
- Concept 06: Learning Hand-Eye Grasping
- Concept 07: Slip Detection and Correction
- Concept 08: WATCH: Concepts in Action
- Concept 09: Meet Josh Lessing
- Concept 10: Exclusive: What is Soft Robotics?
- Concept 11: Exclusive: Our Process
- Concept 12: Exclusive: Soft Robots and Food
- Concept 13: DO: Challenge - Build a Gripper
- Concept 14: DO: Lab - Dex-Net 2.0 Code
-
-
Module 10: Controls
-
Lesson 01: Introduction to Controls
In this lesson, you'll learn about controls engineering and how to create and tune PID controllers; a minimal PID sketch follows the concept list.
- Concept 01: Overview of Controls Engineering
- Concept 02: Open-Loop Control
- Concept 03: Closed-Loop Control
- Concept 04: PID Overview
- Concept 05: Getting to Know Your Simulator
- Concept 06: Proportional Control 1
- Concept 07: Proportional Control 2
- Concept 08: Proportional Control 3
- Concept 09: Building a P Controller
- Concept 10: PI Control
- Concept 11: Building a PI Controller
- Concept 12: PD Control
- Concept 13: Building a PD Controller
- Concept 14: Building a PID Controller
- Concept 15: PID Control Summary
- Concept 16: Limitations of PID
- Concept 17: Beyond the Ideal Case - Integrator Windup
- Concept 18: Beyond the Ideal Case - Noise
- Concept 19: Control Design Objective and Criteria
- Concept 20: Tuning Strategies 1
- Concept 21: Tuning Strategies 2
- Concept 22: Putting It All Together
- Concept 23: Wrap Up
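A minimal, illustrative PID controller in Python (class name, gains, and the crude plant model are my own assumptions, not the lesson's starter code). The control output is the sum of a proportional, an integral, and a derivative term on the error.

```python
# Illustrative PID controller and a toy closed loop.
class PID:
    def __init__(self, kp, ki, kd, set_point=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.set_point = set_point
        self.integral = 0.0
        self.last_error = 0.0

    def update(self, measurement, dt):
        error = self.set_point - measurement
        self.integral += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy usage: drive a first-order system toward a set point of 1.0
pid = PID(kp=2.0, ki=0.5, kd=0.1, set_point=1.0)
state, dt = 0.0, 0.1
for step in range(50):
    control = pid.update(state, dt)
    state += control * dt          # crude plant model for illustration
    if step % 10 == 0:
        print(f"t={step * dt:.1f}s  state={state:.3f}")
```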
-
Lesson 02: Quadrotor Control using PID
In this lesson, you'll learn how to control a quadrotor inside a Unity environment using a PID-based positional controller within a ROS node.
- Concept 01: Introduction to a Positional Controller
- Concept 02: Quadrotor Kinematic and Dynamic Model 1
- Concept 03: Quadrotor Quiz
- Concept 04: Quadrotor Kinematic and Dynamic Model 2
- Concept 05: Cascade PID Control
- Concept 06: Lab Walkthrough
- Concept 07: Environment Setup
- Concept 08: Exploring the Sim
- Concept 09: Helpful Tools
- Concept 10: Completing PID Controller
- Concept 11: Hover Controller
- Concept 12: dynamic_reconfigure
- Concept 13: Attitude Controller
- Concept 14: Positional Controller
- Concept 15: Lab Summary
- Concept 16: PID Wrap Up
-
-
Module 11: Udacity Explores - Swarm Robotics
-
Lesson 01: Udacity Explores: Swarm Robotics
Learn about swarm robotics and how these robots can be used in areas such as medicine and search and rescue.
- Concept 01: Introduction
- Concept 02: READ: Research Papers
- Concept 03: Swarm Robots in Medicine
- Concept 04: Kilobots
- Concept 05: Search and Rescue
- Concept 06: Self-Assembly Swarm Robots
- Concept 07: WATCH: Concepts in Action
- Concept 08: Meet Sabine Hauert
- Concept 09: Exclusive: Sabine Hauert
- Concept 10: DO: Lab - Create a Swarm
-
-
Module 12: Deep Learning
-
Lesson 01: Intro to Neural Networks
In this lesson, Luis Serrano provides you with a solid foundation for understanding how to build powerful neural networks from the ground up. A toy perceptron sketch follows the concept list.
- Concept 01: Introducing Luis!
- Concept 02: Intro to Neural Networks
- Concept 03: Classification Problems 1
- Concept 04: Classification Problems 2
- Concept 05: Linear Boundaries
- Concept 06: Perceptrons
- Concept 07: Why "Neural Networks"?
- Concept 08: Perceptrons as Logical Operators
- Concept 09: Perceptron Trick
- Concept 10: Perceptron Algorithm
- Concept 11: Higher Dimensions
- Concept 12: Error Functions
- Concept 13: Log-loss Error Function
- Concept 14: Discrete vs Continuous
- Concept 15: Softmax
- Concept 16: One-Hot Encoding
- Concept 17: Maximum Likelihood
- Concept 18: Maximizing Probabilities
- Concept 19: Cross-Entropy 1
- Concept 20: Cross-Entropy 2
- Concept 21: Multi-Class Cross Entropy
- Concept 22: Logistic Regression
- Concept 23: Gradient Descent
- Concept 24: Logistic Regression Algorithm
- Concept 25: Non-Linear Regions
- Concept 26: Non-Linear Models
- Concept 27: Neural Network Architecture
- Concept 28: Feedforward
- Concept 29: Backpropagation
- Concept 30: Further Reading
- Concept 31: Neural Networks Wrap Up
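A toy NumPy implementation of the perceptron trick from this lesson (illustrative only; the data and learning rate are made up): nudge the weights toward misclassified points until a linear boundary separates two classes.

```python
# Illustrative perceptron training on a small linearly separable data set.
import numpy as np

def perceptron(X, y, learn_rate=0.1, epochs=50):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b >= 0 else 0
            # Misclassified points pull the boundary toward themselves
            w += learn_rate * (yi - pred) * xi
            b += learn_rate * (yi - pred)
    return w, b

# Toy data: class 1 when the point sits in the upper-right region
X = np.array([[0.1, 0.2], [0.3, 0.4], [0.9, 0.8], [0.7, 0.9]])
y = np.array([0, 0, 1, 1])
w, b = perceptron(X, y)
print("weights:", w, "bias:", b)
print("predictions:", [(1 if x @ w + b >= 0 else 0) for x in X])
```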
-
Lesson 02: TensorFlow for Deep Learning
Vincent Vanhoucke, Principal Scientist at Google Brain, introduces you to deep learning and TensorFlow, Google's deep learning framework. A small NumPy sketch of softmax and cross-entropy follows the concept list.
- Concept 01: Introducing Vincent!
- Concept 02: What is Deep Learning?
- Concept 03: Solving Problems - Big and Small
- Concept 04: Let's Get Started!
- Concept 05: Installing TensorFlow
- Concept 06: Hello, Tensor World!
- Concept 07: Transition to Classification
- Concept 08: Classification
- Concept 09: Let's make a deal
- Concept 10: Training Your Logistic Classifier
- Concept 11: TensorFlow Softmax
- Concept 12: TensorFlow Cross-Entropy
- Concept 13: Minimizing Cross Entropy
- Concept 14: Practical Aspects of Learning
- Concept 15: Quiz: Numerical Stability
- Concept 16: Normalized Inputs and Initial Weights
- Concept 17: Measuring Performance
- Concept 18: Validation and Test Set Size
- Concept 19: Quiz: Validation Set Size
- Concept 20: Validation Set Size Continued
- Concept 21: Optimizing a Logistic Classifier
- Concept 22: Stochastic Gradient Descent
- Concept 23: Momentum and Learning Rate Decay
- Concept 24: Parameter Hyperspace!
- Concept 25: Quiz: Mini-batch
- Concept 26: Epochs
- Concept 27: Lab: TensorFlow Neural Network
- Concept 28: Understanding Jupyter Workspaces
- Concept 29: Jupyter Workspace - TensorFlow
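A plain-NumPy sketch of the two building blocks of the logistic classifier covered here: softmax turns scores (logits) into probabilities, and cross-entropy measures how far those probabilities are from the one-hot label. The lesson itself implements these with TensorFlow ops; the numbers below are arbitrary.

```python
# Illustrative softmax and cross-entropy in NumPy.
import numpy as np

def softmax(logits):
    exp = np.exp(logits - np.max(logits))   # subtract max for numerical stability
    return exp / exp.sum()

def cross_entropy(probs, one_hot_label):
    return -np.sum(one_hot_label * np.log(probs + 1e-12))

logits = np.array([2.0, 1.0, 0.1])
label = np.array([1.0, 0.0, 0.0])           # one-hot: class 0 is correct
probs = softmax(logits)
print("probabilities:", np.round(probs, 3))
print("cross-entropy:", round(cross_entropy(probs, label), 3))
```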
-
Lesson 03: Deep Neural Networks
Vincent walks you through how to go from a simple neural network to a deep neural network. You'll learn about why additional layers can help and how to prevent overfitting.
- Concept 01: Intro to Deep Neural Networks
- Concept 02: Quiz: Number of Parameters
- Concept 03: Linear Models are Limited
- Concept 04: Quiz: Rectified Linear Units
- Concept 05: Network of ReLUs
- Concept 06: 2-Layer Neural Network
- Concept 07: Quiz: TensorFlow ReLUs
- Concept 08: No Neurons
- Concept 09: The Chain Rule
- Concept 10: Backprop
- Concept 11: Deep Neural Network in TensorFlow
- Concept 12: Training a Deep Learning Network
- Concept 13: Save and Restore TensorFlow Models
- Concept 14: Finetuning
- Concept 15: Regularization Intro
- Concept 16: Regularization
- Concept 17: Regularization Quiz
- Concept 18: Dropout
- Concept 19: Dropout Pt. 2
- Concept 20: Quiz: TensorFlow Dropout
- Concept 21: Lab: TensorFlow Deep Neural Network
- Concept 22: Jupyter Workspace - DNN
-
Lesson 04: Convolutional Neural Networks
Vincent explains the theory behind Convolutional Neural Networks and shows you how to dramatically improve performance in image classification.
- Concept 01: Intro To CNNs
- Concept 02: Color
- Concept 03: Statistical Invariance
- Concept 04: Convolutional Networks
- Concept 05: Intuition
- Concept 06: Filters
- Concept 07: Feature Map Sizes
- Concept 08: Convolutions continued
- Concept 09: Parameters
- Concept 10: Quiz: Convolution Output Shape
- Concept 11: Solution: Convolution Output Shape
- Concept 12: Quiz: Number of Parameters
- Concept 13: Solution: Number of Parameters
- Concept 14: Quiz: Parameter Sharing
- Concept 15: Solution: Parameter Sharing
- Concept 16: Visualizing CNNs
- Concept 17: TensorFlow Convolution Layer
- Concept 18: Explore The Design Space
- Concept 19: TensorFlow Max Pooling
- Concept 20: Pooling Intuition
- Concept 21: Quiz: Pooling Mechanics
- Concept 22: Solution: Pooling Mechanics
- Concept 23: Pooling Practice
- Concept 24: Average Pooling
- Concept 25: 1x1 Convolutions
- Concept 26: Inception Module
- Concept 27: Convolutional Network in TensorFlow
- Concept 28: TensorFlow Convolution Layer
- Concept 29: Solution: TensorFlow Convolution Layer
- Concept 30: TensorFlow Pooling Layer
- Concept 31: Solution: TensorFlow Pooling Layer
- Concept 32: Lab: TensorFlow CNN
- Concept 33: GPU Workspace Introduction
- Concept 34: Jupyter Workspace - CNN
- Concept 35: CNNs - Additional Resources
- Concept 36: CNNs Wrapup
-
Lesson 05: Fully Convolutional Networks
In this lesson, you'll learn the motivation for Fully Convolutional Networks and how they are structured. A minimal Keras sketch follows the concept list.
- Concept 01: Intro
- Concept 02: Why Fully Convolutional Networks (FCNs)?
- Concept 03: Fully Convolutional Networks
- Concept 04: Fully Connected to 1x1 Convolution
- Concept 05: 1x1 Convolution Quiz
- Concept 06: 1x1 Convolution Quiz Solution
- Concept 07: Transposed Convolutions
- Concept 08: Transposed Convolution Quiz
- Concept 09: Transposed Convolution Quiz Solution
- Concept 10: Skip Connections
- Concept 11: FCNs In The Wild
- Concept 12: Outro
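A minimal Keras sketch (the segmentation lab that follows also uses Keras, but this is not its model; layer sizes and the three output classes are assumptions) of two FCN ingredients from this lesson: a 1x1 convolution in place of a fully connected layer, and transposed convolutions that upsample back to the input resolution.

```python
# Illustrative fully convolutional encoder/decoder in Keras.
from tensorflow.keras import layers, Model

inputs = layers.Input(shape=(64, 64, 3))
x = layers.Conv2D(32, 3, strides=2, padding='same', activation='relu')(inputs)      # encoder: 32x32
x = layers.Conv2D(64, 3, strides=2, padding='same', activation='relu')(x)           # encoder: 16x16
x = layers.Conv2D(64, 1, activation='relu')(x)                                      # 1x1 conv preserves spatial layout
x = layers.Conv2DTranspose(32, 3, strides=2, padding='same', activation='relu')(x)  # decoder: 32x32
x = layers.Conv2DTranspose(16, 3, strides=2, padding='same', activation='relu')(x)  # decoder: 64x64
outputs = layers.Conv2D(3, 1, activation='softmax')(x)   # per-pixel scores for 3 assumed classes

model = Model(inputs, outputs)
model.summary()   # output shape is (None, 64, 64, 3): one prediction per pixel
```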
-
Lesson 06: Lab: Semantic Segmentation
In this lesson you'll be introduced to the problem of Scene Understanding and the role FCNs play.
- Concept 01: Intro
- Concept 02: Bounding Boxes
- Concept 03: Semantic Segmentation
- Concept 04: Scene Understanding
- Concept 05: Lab: Semantic Segmentation
- Concept 06: Lab: Getting Started
- Concept 07: Lab: Keras
- Concept 08: Lab: Encoder
- Concept 09: Lab: Batch Normalization
- Concept 10: Lab: Decoder
- Concept 11: Lab: Layer Concatenation
- Concept 12: IoU
- Concept 13: IoU Quiz
- Concept 14: Outro
- Concept 15: Jupyter Workspace - Semantic Segmentation
-
Lesson 07: Project: Follow Me
In this project, you'll build and train an FCN to find a specific person in images from a simulated quadcopter.
- Concept 01: Project Intro
- Concept 02: Setting up your Local Environment
- Concept 03: Working with the Simulator
- Concept 04: Collecting your Data
- Concept 05: Data Collection Guide
- Concept 06: Building your Segmentation Network
- Concept 07: Training and Validation
- Concept 08: Udacity GPU Workspace Introduction
- Concept 09: Udacity Workspace Best Practices
- Concept 10: Workspace - Follow Me
- Concept 11: AWS GPU Instance Preparation
- Concept 12: AWS GPU Instance Set Up
- Concept 13: Testing your Model in the Simulator
-
Lesson 08: Term 1 Outro
Wrapping up your first term!
-
-
Module 13: Introduction to C++ for Robotics
-
Lesson 01: Introduction to C++ for Robotics
In this lesson, you will learn the C++ you need to prepare you for your second term in the Robotics Nanodegree program.
- Concept 01: Introducing Karim
- Concept 02: Welcome!
- Concept 03: Overview
- Concept 04: Transitioning
- Concept 05: Ubuntu Setup
- Concept 06: Editor Choice
- Concept 07: Hello, World!
- Concept 08: Compile and Execute
- Concept 09: Functions and Data Structures
- Concept 10: Classes and Objects
- Concept 11: Inheritance and Pointers
- Concept 12: Template Classes
- Concept 13: External Libraries
- Concept 14: ROS Nodes
- Concept 15: Rover Control
- Concept 16: Challenge!
-
Part 02 : Term 2: Localization, Mapping, and Navigation
Learn to apply SLAM and reinforcement learning techniques for solving robotics problems.
-
Module 01: Introduction to Term 2
-
Lesson 01: Introduction to Term 2
Meet the Term 2 instructors and learn about the Term 2 outline.
-
Lesson 02: The Jetson TX2
Learn how to identify the key components of the Jetson TX2 and how to set it up.
-
Lesson 03: Interacting with Robotics Hardware
Learn about basic hardware common to robotics, including specific information for use with the Jetson TX2.
-
Lesson 04: Lab: Hardware Hello World
Hands-on lab to control an LED from the Jetson TX2 command line, including how to calculate correct resistor values.
-
Lesson 05: Robotics Sensor Options
General information about the many types of sensors used in robotic systems, including cameras, IMUs, encoders, and more.
-
-
Module 02: Robotic Systems Deployment
-
Lesson 01: Inference Development
Learn how to use the Nvidia DIGITS tool to manage data and model training.
-
Lesson 02: Inference Applications in Robotics
Learn to evaluate real-time considerations when using an inference engine in robotics applications.
-
Lesson 03: Project: Robotic Inference
Design your own robotic system using inference, collect your own data set for classification, and justify network design choices based on your analysis.
- Concept 01: Overview
- Concept 02: Practicing the DIGITS Workflow
- Concept 03: Picking your Robotic Inference Idea
- Concept 04: Collecting your own Data
- Concept 05: Importance of Documentation
- Concept 06: Optional: Deploying your Inference Project
- Concept 07: DIGITS Workspace
- Concept 08: Recap
- Concept 09: Common Questions
-
-
Module 03: Localization
-
Lesson 01: Introduction to Localization
An introduction to the concept of localization and the algorithms used to solve it.
-
Lesson 02: Kalman Filters
Learn the Kalman Filter and Extended Kalman Filter Gaussian estimation algorithms. A one-dimensional Kalman filter sketch follows the concept list.
- Concept 01: Overview
- Concept 02: What's a Kalman Filter?
- Concept 03: History
- Concept 04: Applications
- Concept 05: Variations
- Concept 06: Robot Uncertainty
- Concept 07: Kalman Filter Advantage
- Concept 08: 1D Gaussian
- Concept 09: Designing 1D Kalman Filters
- Concept 10: Measurement Update
- Concept 11: State Prediction
- Concept 12: 1D Kalman Filter
- Concept 13: Multivariate Gaussian
- Concept 14: Intro to Multidimensional KF
- Concept 15: Design of Multidimensional KF
- Concept 16: Introduction to EKF
- Concept 17: EKF
- Concept 18: EKF Example
- Concept 19: Recap
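A one-dimensional Kalman filter sketch (variable names and the measurement/motion variances are illustrative assumptions, not the lesson's code): the measurement update fuses the prior belief with a noisy measurement, and the state prediction shifts the estimate by the commanded motion.

```python
# Illustrative 1D Kalman filter: alternate measurement updates and state predictions.
def measurement_update(mean, var, meas_mean, meas_var):
    new_mean = (meas_var * mean + var * meas_mean) / (var + meas_var)
    new_var = 1.0 / (1.0 / var + 1.0 / meas_var)
    return new_mean, new_var

def state_prediction(mean, var, motion_mean, motion_var):
    return mean + motion_mean, var + motion_var

# Robot moving along a line, starting with a very uncertain belief
mean, var = 0.0, 1000.0
measurements = [5.0, 6.1, 7.0, 8.1]
motions = [1.0, 1.0, 1.0, 1.0]
for z, u in zip(measurements, motions):
    mean, var = measurement_update(mean, var, z, meas_var=4.0)
    mean, var = state_prediction(mean, var, u, motion_var=2.0)
    print(f"belief: mean={mean:.2f}, var={var:.2f}")
```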
-
Lesson 03: Lab: Kalman Filters
Learn how to apply an EKF ROS package to a robot to estimate its pose.
- Concept 01: Introduction
- Concept 02: Sensor Fusion
- Concept 03: Catkin Workspace
- Concept 04: Udacity Workspace
- Concept 05: TurtleBot Gazebo Package
- Concept 06: Robot Pose EKF Package
- Concept 07: Odometry to Trajectory Package
- Concept 08: TurtleBot Teleop Package
- Concept 09: Rviz Package
- Concept 10: Main Launch
- Concept 11: Rqt Multiplot
- Concept 12: Outro
-
Lesson 04: Monte Carlo Localization
Learn the Monte Carlo Localization algorithm, which uses particle filters to estimate a robot's pose.
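A stripped-down sketch of one Monte Carlo Localization cycle in 1D (illustrative only; the noise values are assumptions, and the following lesson builds the full version in C++): move every particle with noise, weight it by how well it explains the measurement, then resample in proportion to the weights.

```python
# Illustrative 1D particle filter: motion update, measurement update, resampling.
import numpy as np

rng = np.random.default_rng(42)
true_pos = 2.0
particles = rng.uniform(0.0, 10.0, size=1000)    # uniform initial belief

for _ in range(5):
    # Motion update: everything moves +1 with some noise
    true_pos += 1.0
    particles += 1.0 + rng.normal(0.0, 0.1, size=particles.size)

    # Measurement update: weight particles by a Gaussian sensor model
    z = true_pos + rng.normal(0.0, 0.5)
    weights = np.exp(-0.5 * ((particles - z) / 0.5) ** 2)
    weights /= weights.sum()

    # Resample particles in proportion to their importance weights
    particles = rng.choice(particles, size=particles.size, p=weights)

print(f"true position: {true_pos:.2f}, estimate: {particles.mean():.2f}")
```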
-
Lesson 05: Build MCL in C++
Learn how to code the Monte Carlo Localization algorithm in C++.
- Concept 01: Introduction
- Concept 02: Robot Class
- Concept 03: First Interaction
- Concept 04: Motion and Sensing
- Concept 05: Noise
- Concept 06: Particle Filter
- Concept 07: Importance Weight
- Concept 08: Resampling
- Concept 09: Resampling Wheel
- Concept 10: Error
- Concept 11: Graphing
- Concept 12: Udacity Workspace
- Concept 13: Images
- Concept 14: Outro
-
Lesson 06: Project: Where Am I?
You will build your own mobile robot and use the Adaptive Monte Carlo Localization algorithm in ROS to estimate the robot’s pose.
- Concept 01: Overview
- Concept 02: Gazebo: Hello, world!
- Concept 03: Robot Model: Basic Setup
- Concept 04: Let There Be Sight!
- Concept 05: RViz Integration
- Concept 06: Localization: Map
- Concept 07: Localization: AMCL Package
- Concept 08: Localization: Parameter Tuning - 1
- Concept 09: Localization: Parameter Tuning - 2
- Concept 10: Launching and Testing
- Concept 11: Outro
- Concept 12: Common Questions
- Concept 13: Project Challenge
- Concept 14: Project Workspace
-
-
Module 04: Mapping and SLAM
-
Lesson 01: Introduction to Mapping and SLAM
An introduction to the concepts of mapping and SLAM, as well as the algorithms behind them.
-
Lesson 02: Occupancy Grid Mapping
Learn how to map an environment with the Occupancy Grid Mapping algorithm. A log-odds update sketch follows the concept list.
- Concept 01: Introduction
- Concept 02: Importance of Mapping
- Concept 03: Challenges and Difficulties
- Concept 04: Mapping with Known Poses
- Concept 05: Posterior Probability
- Concept 06: Grid Cells
- Concept 07: Computing the Posterior
- Concept 08: Filtering
- Concept 09: Binary Bayes Filter Algorithm
- Concept 10: Occupancy Grid Mapping Algorithm
- Concept 11: Inverse Sensor Model
- Concept 12: Generate the Map
- Concept 13: Udacity Workspace
- Concept 14: Multi Sensor Fusion
- Concept 15: Introduction to 3D Mapping
- Concept 16: 3D Data Representations
- Concept 17: Octomap
- Concept 18: Outro
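An illustrative log-odds update for a single grid cell, the core of the binary Bayes filter used in occupancy grid mapping. The occupied/free probabilities below stand in for an inverse sensor model and are assumptions; cell indexing and ray casting are omitted.

```python
# Illustrative binary Bayes filter update in log-odds form for one cell.
import math

def log_odds(p):
    return math.log(p / (1.0 - p))

def prob(l):
    return 1.0 - 1.0 / (1.0 + math.exp(l))

l_occ, l_free, l_prior = log_odds(0.7), log_odds(0.3), log_odds(0.5)

l = l_prior                                    # start with an unknown cell
for hit in [True, True, False, True]:          # simulated sensor returns
    l += (l_occ if hit else l_free) - l_prior  # binary Bayes filter update
    print(f"P(occupied) = {prob(l):.2f}")
```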
-
Lesson 03: Grid-based FastSLAM
Learn how to simultaneously map an environment and localize a robot relative to the map with the Grid-based FastSLAM algorithm.
- Concept 01: Introduction
- Concept 02: Online SLAM
- Concept 03: Full SLAM
- Concept 04: Nature of SLAM
- Concept 05: Correspondence
- Concept 06: SLAM Challenges
- Concept 07: Particle Filter Approach to SLAM
- Concept 08: Introduction to FastSLAM
- Concept 09: FastSLAM Instances
- Concept 10: Adapting FastSLAM to Grid Maps
- Concept 11: Grid-based FastSLAM Techniques
- Concept 12: The Grid-based FastSLAM Algorithm
- Concept 13: gmapping ROS Package
- Concept 14: Udacity Workspace
- Concept 15: SLAM with ROS
- Concept 16: Outro
-
Lesson 04: GraphSLAM
Learn how to simultaneously map an environment and localize a robot relative to the map with the GraphSLAM algorithm.
- Concept 01: Introduction
- Concept 02: Graphs
- Concept 03: Constraints
- Concept 04: Front-End vs Back-End
- Concept 05: Maximum Likelihood Estimation
- Concept 06: MLE Example
- Concept 07: Numerical Solution to MLE
- Concept 08: Mid-Lesson Overview
- Concept 09: 1-D to n-D
- Concept 10: Information Matrix and Vector
- Concept 11: Inference
- Concept 12: Nonlinear Constraints
- Concept 13: Graph-SLAM at a Glance
- Concept 14: Intro to 3D SLAM With RTAB-Map
- Concept 15: 3D SLAM With RTAB-Map
- Concept 16: Visual Bag-of-Words
- Concept 17: RTAB-Map Memory Management
- Concept 18: RTAB-Map Optimization and Output
- Concept 19: Outro
-
Lesson 05: Project: Map My World Robot
Deploy RTAB-Map on your simulated robot to create 2D and 3D maps of a predefined environment. Then create your own environment to map!
- Concept 01: Project Introduction
- Concept 02: Extending your Robot Creation - Sensor Upgrade
- Concept 03: Extending your Robot Creation - Launch Files
- Concept 04: Debugging in ROS - Transform Frames
- Concept 05: Debugging in ROS - roswtf
- Concept 06: Debugging in ROS - rqt common plugins
- Concept 07: RTAB-Map Visualization Tools - Database Viewer
- Concept 08: RTAB-Map Visualization Tools - rtabmapviz
- Concept 09: Final Considerations
- Concept 10: Going Above and Beyond - Localization
- Concept 11: Going Above and Beyond - Other RTAB-Map Features
- Concept 12: Ubuntu Install and Common Troubleshooting
- Concept 13: Udacity Workspace
- Concept 14: Common Questions
-
-
Module 05: Reinforcement Learning for Robotics
-
Lesson 01: Intro to RL for Robotics
An introduction to reinforcement learning for robotics.
-
Lesson 02: RL Basics
Learn the fundamentals of classic Reinforcement Learning.
- Concept 01: Introduction
- Concept 02: Applications
- Concept 03: The Setting
- Concept 04: Reference Guide
- Concept 05: The Setting, Revisited
- Concept 06: Episodic vs. Continuing Tasks
- Concept 07: Quiz: Test Your Intuition
- Concept 08: Quiz: Episodic or Continuing?
- Concept 09: The Reward Hypothesis
- Concept 10: Goals and Rewards, Part 1
- Concept 11: Goals and Rewards, Part 2
- Concept 12: Quiz: Goals and Rewards
- Concept 13: Cumulative Reward
- Concept 14: Discounted Return
- Concept 15: Quiz: Pole-Balancing
- Concept 16: MDPs, Part 1
- Concept 17: MDPs, Part 2
- Concept 18: Quiz: One-Step Dynamics, Part 1
- Concept 19: Quiz: One-Step Dynamics, Part 2
- Concept 20: MDPs, Part 3
- Concept 21: Summary
- Concept 22: Policies
- Concept 23: Quiz: Interpret the Policy
- Concept 24: Gridworld Example
- Concept 25: State-Value Functions
- Concept 26: Bellman Equations
- Concept 27: Quiz: State-Value Functions
- Concept 28: Optimality
- Concept 29: Action-Value Functions
- Concept 30: Quiz: Action-Value Functions
- Concept 31: Optimal Policies
- Concept 32: Quiz: Optimal Policies
-
Lesson 03: Q-Learning Lab
Learn the Q-Learning algorithm and use it to solve problems in OpenAI Gym.
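A minimal tabular Q-learning sketch on a tiny hand-rolled corridor world (not an OpenAI Gym environment, and the rewards and hyperparameters are assumptions), using the same update rule the lab applies in Gym: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).

```python
# Illustrative tabular Q-learning on a 5-cell corridor; the goal is the right-most cell.
import numpy as np

n_states, n_actions = 5, 2               # actions: 0 = left, 1 = right
goal = n_states - 1
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.95, 0.1
rng = np.random.default_rng(0)

for episode in range(500):
    s = 0
    for _ in range(200):                 # cap episode length
        a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[s].argmax())
        s_next = min(s + 1, goal) if a == 1 else max(s - 1, 0)
        r = -1.0                         # step cost; the episode ends at the goal
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next
        if s == goal:
            break

# With the step cost, the greedy policy should settle on "always move right"
print("Greedy policy (1 = right), goal state excluded:", Q[:-1].argmax(axis=1))
```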
-
Lesson 04: Deep RL
Learn how to combine neural networks with RL in the Deep Q-Network (DQN) algorithm.
-
Lesson 05: DQN Lab
Learn to solve OpenAI Gym problems with the DQN algorithm.
-
Lesson 06: Deep RL Manipulator
Learn to apply DQN to a robotic problem.
-
Lesson 07: Project: Deep RL Arm Manipulation
Build a robotic arm that learns with Deep RL.
-
-
Module 06: Path Planning and Navigation
-
Lesson 01: Intro to Path Planning and Navigation
Learn what the lessons in Path Planning and Navigation will cover.
-
Lesson 02: Classic Path Planning
Learn a number of classic path planning approaches that can be applied to low-dimensional robotic systems. A compact A* sketch follows the concept list.
- Concept 01: Introduction to Path Planning
- Concept 02: Examples of Path Planning
- Concept 03: Approaches to Path Planning
- Concept 04: Discrete Planning
- Concept 05: Continuous Representation
- Concept 06: Minkowski Sum
- Concept 07: Quiz: Minkowski Sum
- Concept 08: Minkowski Sum C++
- Concept 09: Translation and Rotation
- Concept 10: 3D Configuration Space
- Concept 11: Discretization
- Concept 12: Roadmap
- Concept 13: Visibility Graph
- Concept 14: Voronoi Diagram
- Concept 15: Cell Decomposition
- Concept 16: Approximate Cell Decomposition
- Concept 17: Potential Field
- Concept 18: Discretization Wrap-Up
- Concept 19: Graph Search
- Concept 20: Terminology
- Concept 21: Breadth-First Search
- Concept 22: Depth-First Search
- Concept 23: Uniform Cost Search
- Concept 24: A* Search
- Concept 25: Overall Concerns
- Concept 26: Graph-Search Wrap-Up
- Concept 27: Discrete Planning Wrap-Up
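The lab that follows implements BFS and A* in C++; here is a compact Python sketch of A* on a small occupancy grid (the grid, start, and goal are made up for illustration), using the Manhattan distance as an admissible heuristic.

```python
# Illustrative A* search on a tiny grid.
import heapq

def a_star(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]                # (f, g, cell, path)
    visited = set()
    while frontier:
        f, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                heapq.heappush(frontier, (g + 1 + h((r, c)), g + 1, (r, c), path + [(r, c)]))
    return None   # no path found

grid = [[0, 1, 0, 0],     # 0 = free, 1 = obstacle
        [0, 1, 0, 1],
        [0, 0, 0, 0]]
print(a_star(grid, start=(0, 0), goal=(0, 3)))
```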
-
Lesson 03: Lab: Path Planning
Learn to code the BFS and A* algorithms in C++.
-
Lesson 04: Sample-Based and Probabilistic Path Planning
Learn about sample-based and probabilistic path planning and how they can improve on the classic approach.
- Concept 01: Introduction to Sample-Based & Probabilistic Path Planning
- Concept 02: Why Sample-Based Planning
- Concept 03: Weakening Requirements
- Concept 04: Sample-Based Planning
- Concept 05: Probabilistic Roadmap (PRM)
- Concept 06: Rapidly Exploring Random Tree Method (RRT)
- Concept 07: Path Smoothing
- Concept 08: Overall Concerns
- Concept 09: Sample-Based Planning Wrap-Up
- Concept 10: Introduction to Probabilistic Path Planning
- Concept 11: Markov Decision Process
- Concept 12: Policies
- Concept 13: State Utility
- Concept 14: Value Iteration Algorithm
- Concept 15: Probabilistic Path Planning Wrap-Up
-
Lesson 05: Research in Navigation
Learn about recent research that applies deep learning to robotic navigation problems.
-
Lesson 06: Project: Home Service Robot
Program a home service robot that will autonomously map an environment and navigate to pick up and deliver objects.
- Concept 01: Introduction
- Concept 02: Working Environment
- Concept 03: Udacity Workspace
- Concept 04: Shell Scripts
- Concept 05: Catkin Workspace
- Concept 06: Building Editor
- Concept 07: Testing SLAM
- Concept 08: Wall Follower
- Concept 09: Testing Navigation
- Concept 10: Reaching Multiple Goals
- Concept 11: Modeling Virtual Objects
- Concept 12: Putting it all Together
-
-
Module 07: Career Services
-
Lesson 01: Strengthen Your Online Presence Using LinkedIn
Find your next job or connect with industry peers on LinkedIn. Ensure your profile attracts relevant leads that will grow your professional network.
- Concept 01: Get Opportunities with LinkedIn
- Concept 02: Use Your Story to Stand Out
- Concept 03: Why Use an Elevator Pitch
- Concept 04: Create Your Elevator Pitch
- Concept 05: Use Your Elevator Pitch on LinkedIn
- Concept 06: Create Your Profile With SEO In Mind
- Concept 07: Profile Essentials
- Concept 08: Work Experiences & Accomplishments
- Concept 09: Build and Strengthen Your Network
- Concept 10: Reaching Out on LinkedIn
- Concept 11: Boost Your Visibility
- Concept 12: Up Next
-
Lesson 02: Optimize Your GitHub Profile
Other professionals are collaborating on GitHub and growing their networks. Submit your profile to ensure it is on par with leaders in your field.
- Concept 01: Prove Your Skills With GitHub
- Concept 02: Introduction
- Concept 03: GitHub profile important items
- Concept 04: Good GitHub repository
- Concept 05: Interview with Art - Part 1
- Concept 06: Identify fixes for example “bad” profile
- Concept 07: Quick Fixes #1
- Concept 08: Quick Fixes #2
- Concept 09: Writing READMEs with Walter
- Concept 10: Interview with Art - Part 2
- Concept 11: Commit messages best practices
- Concept 12: Reflect on your commit messages
- Concept 13: Participating in open source projects
- Concept 14: Interview with Art - Part 3
- Concept 15: Participating in open source projects 2
- Concept 16: Starring interesting repositories
- Concept 17: Next Steps
-
-
Module 08: Completing the Program
-
Lesson 01: Completing the Program
Congratulations! You've reached the end of the Robotics Software Engineer Nanodegree program! Read on to learn how to officially complete the program and graduate.
-
Part 03 (Elective): Optional Kuka Path Planning Project
Solve a maze with path-planning and run it in simulation and hardware on a Kuka arm in the Kuka Challenge!
-
Module 01: Optional Kuka Path Planning Project
-
Lesson 01: Project Introduction
Plan a path through a maze for an industrial Kuka Manipulator Arm, then watch your code run on real hardware!
-
Lesson 02: Project Details
Helpful instructions to get you started with the project.
- Concept 01: Project Specification
- Concept 02: Getting Started
- Concept 03: Scoring Criteria
- Concept 04: Path Planning
- Concept 05: Project Workspace
- Concept 06: Submission Instructions
- Concept 07: Project Walkthrough
- Concept 08: Hints
- Concept 09: Maze #1 Leaderboard
- Concept 10: Maze #2 Leaderboard
- Concept 11: Contest Maze Leaderboard
-
Part 04 (Elective): Autonomous Systems Interview
Start off with some tips on interviewing for an autonomous systems role, then watch how candidates approach their interview questions. Finish off by practicing some questions of your own!
-
Module 01: Autonomous Systems Interview
-
Lesson 01: Autonomous Systems Interview Practice
Start off with some tips on interviewing for an autonomous systems role, then watch how candidates approach their interview questions. Finish off by practicing some questions of your own!
Project Description - Autonomous Systems Interview Practice Project
Project Rubric - Autonomous Systems Interview Practice Project
- Concept 01: Welcome to the Course!
- Concept 02: Job Titles
- Concept 03: Your Part
- Concept 04: One Piece of the Puzzle
- Concept 05: Job Descriptions
- Concept 06: Research the Company
- Concept 07: Let's Get Started
- Concept 08: Perception Engineer
- Concept 09: Perception Engineer Reflection
- Concept 10: Deep Learning Engineer
- Concept 11: Deep Learning Engineer Reflection
- Concept 12: Motion Planning Engineer
- Concept 13: Motion Planning Engineer Reflection
- Concept 14: Mapping/Localization Engineer
- Concept 15: Mapping/Localization Engineer Reflection
- Concept 16: Control Engineer
- Concept 17: Control Engineer Reflection
- Concept 18: My Own Project
- Concept 19: My Own Project Reflection
- Concept 20: Additional Resources to Consider
- Concept 21: Final Thoughts
- Concept 22: Project Instructions
- Concept 23: Perception/Sensor Questions
- Concept 24: Deep Learning Questions
- Concept 25: Motion Planning Questions
- Concept 26: Mapping/Localization Questions
- Concept 27: Control Questions
- Concept 28: General Questions
-